Regularized Generalized Canonical Correlation Analysis
Authors
Abstract
Similar resources
Multiway Regularized Generalized Canonical Correlation Analysis
Regularized Generalized Canonical Correlation Analysis (RGCCA) is currently geared towards the analysis of two-way data matrices. In this paper, multiway RGCCA (MGCCA) extends RGCCA to the multiway data configuration. More specifically, MGCCA aims at studying the complex relationships between a set of three-way data tables.
Regularized Generalized Canonical Correlation Analysis Extended to Symbolic Data
Regularized Generalized Canonical Correlation Analysis (RGCCA) is a component-based approach which aims at studying the relationship between several blocks of numerical variables. In this paper we propose a method called Symbolic Generalized Canonical Correlation Analysis (Symbolic GCCA) that extends RGCCA to symbolic data. It is a versatile tool for multi-block data analysis that can deal with...
The RGCCA package for Regularized/Sparse Generalized Canonical Correlation Analysis
2 Multiblock data analysis with the RGCCA package
2.1 Regularized Generalized Canonical Correlation Analysis
2.2 Variable selection in RGCCA: SGCCA
2.3 Higher stage block components
2.4 Implementatio...
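To make the shrinkage idea behind RGCCA's regularization concrete, below is a minimal two-block sketch in NumPy. This is an illustrative toy, not the RGCCA R package or the algorithm of the paper above: the function name `regularized_cca` and its defaults are hypothetical, and the tau parameters simply interpolate each block covariance towards the identity.

```python
import numpy as np

def regularized_cca(X, Y, tau_x=0.5, tau_y=0.5):
    """Toy two-block regularized CCA (hypothetical helper, not the RGCCA R package API).

    Each block covariance is shrunk towards the identity, in the spirit of the
    tau parameter used in RGCCA (tau = 0: correlation-like criterion,
    tau = 1: covariance-like criterion).
    """
    n = X.shape[0]
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Cxx = (1 - tau_x) * (X.T @ X) / n + tau_x * np.eye(X.shape[1])
    Cyy = (1 - tau_y) * (Y.T @ Y) / n + tau_y * np.eye(Y.shape[1])
    Cxy = (X.T @ Y) / n
    # Whiten both blocks with Cholesky factors, then reduce the problem to an SVD.
    Lx, Ly = np.linalg.cholesky(Cxx), np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(M)
    a = np.linalg.solve(Lx.T, U[:, 0])   # weight vector for block X
    b = np.linalg.solve(Ly.T, Vt[0, :])  # weight vector for block Y
    return a, b, s[0]                    # s[0]: first regularized canonical correlation

# Usage on simulated data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Y = X @ rng.normal(size=(5, 4)) + 0.5 * rng.normal(size=(100, 4))
a, b, rho = regularized_cca(X, Y, tau_x=0.1, tau_y=0.1)
print(round(rho, 3))
```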
Kernel Generalized Canonical Correlation Analysis
A classical problem in statistics is to study relationships between several blocks of variables. The goal is to find variables of one block that are directly related to variables of other blocks. Regularized Generalized Canonical Correlation Analysis (RGCCA) is a very attractive framework for studying such relationships between blocks. However, RGCCA captures linear relations between blocks an...
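To illustrate the dual (Gram-matrix) view that kernel extensions rely on, here is a small sketch of regularized kernel CCA for two blocks with a Gaussian kernel. It is a hedged toy under one common regularization choice (a ridge term on the squared Gram matrices), not the method of the cited paper; `rbf_gram` and `kernel_cca` are hypothetical helper names.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Gaussian (RBF) Gram matrix, centered in feature space."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca(Kx, Ky, reg=1e-2):
    """Toy regularized kernel CCA in the dual (hypothetical helper):
    maximize a' Kx Ky b subject to a'(Kx Kx + reg*I) a = 1 and
    b'(Ky Ky + reg*I) b = 1."""
    n = Kx.shape[0]
    Lx = np.linalg.cholesky(Kx @ Kx + reg * np.eye(n))
    Ly = np.linalg.cholesky(Ky @ Ky + reg * np.eye(n))
    M = np.linalg.solve(Lx, Kx @ Ky) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(M)
    alpha = np.linalg.solve(Lx.T, U[:, 0])   # dual weights for block X
    beta = np.linalg.solve(Ly.T, Vt[0, :])   # dual weights for block Y
    return alpha, beta, s[0]
```

Working with n × n Gram matrices rather than the raw variables is what lets the choice of kernel move the analysis from linear to nonlinear relations between blocks.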
Deep Generalized Canonical Correlation Analysis
We present Deep Generalized Canonical Correlation Analysis (DGCCA) – a method for learning nonlinear transformations of arbitrarily many views of data, such that the resulting transformations are maximally informative of each other. While methods for nonlinear two-view representation learning (Deep CCA, (Andrew et al., 2013)) and linear many-view representation learning (Generalized CCA (Horst,...
Journal
Journal title: Psychometrika
Year: 2011
ISSN: 0033-3123, 1860-0980
DOI: 10.1007/s11336-011-9206-8